B Aggregation of Convex-Combination Influence Models

Abstract

Frequent visitors to the website Rotten Tomatoes may be surprised to discover that a movie with a 15% positive rating is not necessarily better reviewed than one which gets, say, 45% positive. This is because dichotomizing reviews as simply good or bad discards a great deal of information. A strong consensus that a film is not great (say, mostly C-) will result in a low percentage of positives. On the other hand, a movie with many terrible reviews but a strong lukewarm contingent (say, 55% Fs and Ds, 45% C+) can have a much higher "positive" percentage. While the subject matter is much less important, this example has exactly the structure of voters located on a one-dimensional latent political space being forced to choose between two candidates.

More generally, this points to the importance of uncertainty assessments. Even if one were to accept all of Groseclose's modeling assumptions, the reader of Left Turn has no idea what the views of the hypothetical unbiased American voter are. The reader knows Groseclose's point estimate (≈ 25, as we said), but this estimate, like every other, is uncertain and surrounded by some margin of error. Since it is a function of earlier estimates, themselves imprecise, the obvious and correct thing to do is to propagate the uncertainty. But Left Turn gives no sense of how much uncertainty attaches to its claims, even of a purely statistical sort. Equally seriously, Nyhan [2005] and Gasper [2011] have shown that estimates from Groseclose's first, ideal-point model are highly sensitive to details of which data are included in the analysis, adding systematic to statistical uncertainty. We do not know how big the margin of error for Groseclose's estimate of the location of Unbiased America should be, but it surely matters whether a good confidence interval is [22, 28], [10, 90], or indeed [0, 100].
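The dichotomization point can be made concrete with a toy calculation. The grade distributions below are invented for illustration, with grades on a 0 (F) to 4 (A) scale and a review counted as "positive" when it is above C (2.0):

```python
def summarize(grades):
    """Return the share of positive reviews and the mean grade."""
    positive = sum(g > 2.0 for g in grades) / len(grades)
    mean = sum(grades) / len(grades)
    return positive, mean

# Film 1: strong consensus around C- (1.7), with a handful of B- (2.7) reviews.
film1 = [1.7] * 85 + [2.7] * 15
# Film 2: 55% Fs and Ds, plus a strong lukewarm C+ (2.3) contingent.
film2 = [0.0] * 30 + [1.0] * 25 + [2.3] * 45

p1, m1 = summarize(film1)   # 15% positive, mean grade around 1.85
p2, m2 = summarize(film2)   # 45% positive, mean grade around 1.29
```

Film 2 ends up with three times Film 1's positive percentage even though its average grade is substantially lower, which is exactly the pattern described above.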


Similar articles

Hermite-Hadamard inequalities for $\mathbb{B}$-convex and $\mathbb{B}^{-1}$-convex functions

The Hermite-Hadamard inequality is one of the fundamental applications of convex functions in the theory of inequalities. In this paper, Hermite-Hadamard inequalities for $\mathbb{B}$-convex and $\mathbb{B}^{-1}$-convex functions are proved.
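For reference, the classical Hermite-Hadamard inequality, of which the paper proves $\mathbb{B}$-convex and $\mathbb{B}^{-1}$-convex analogues, states that for an ordinary convex function $f$ on $[a, b]$:

```latex
f\!\left(\frac{a+b}{2}\right)
  \;\le\; \frac{1}{b-a}\int_a^b f(x)\,dx
  \;\le\; \frac{f(a)+f(b)}{2}
```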


Linear and convex aggregation of density estimators

We study the problem of learning the best linear and convex combination of M estimators of a density with respect to the mean squared risk. We suggest aggregation procedures and we prove sharp oracle inequalities for their risks, i.e., oracle inequalities with leading constant 1. We also obtain lower bounds showing that these procedures attain optimal rates of aggregation. As an example, we con...
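A minimal sketch of the convex-aggregation idea: mix two kernel density estimates with a weight t in [0, 1] chosen to minimize squared error. Here the weight is selected against the true density on a grid, an oracle choice used purely for illustration; the bandwidths and sample sizes are invented:

```python
import math
import random

def kde(sample, h):
    """Gaussian kernel density estimate with bandwidth h."""
    c = 1.0 / (len(sample) * h * math.sqrt(2 * math.pi))
    return lambda x: c * sum(math.exp(-0.5 * ((x - s) / h) ** 2) for s in sample)

def true_density(x):
    """Standard normal density, the truth in this simulation."""
    return math.exp(-0.5 * x * x) / math.sqrt(2 * math.pi)

random.seed(0)
sample = [random.gauss(0, 1) for _ in range(100)]
f1 = kde(sample, h=0.2)    # undersmoothed estimator
f2 = kde(sample, h=1.5)    # oversmoothed estimator

grid = [0.1 * i - 3.0 for i in range(61)]

def risk(t):
    """Squared error of the convex mixture t*f1 + (1-t)*f2 on the grid."""
    return sum((t * f1(x) + (1 - t) * f2(x) - true_density(x)) ** 2 for x in grid)

# Search the convex weights {0, 0.05, ..., 1}; since t = 0 and t = 1 recover the
# individual estimators, the best mixture is never worse than either component.
best_t = min((k / 20 for k in range(21)), key=risk)
```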


A NEW APPROACH OF FUZZY NUMBERS WITH DIFFERENT SHAPES AND DEVIATION

In this paper, we propose a new method for fuzzy numbers. In this method, we assume that Ai = (ai1, ai2, ai3, ai4) is a fuzzy number. The convex combination of ai1 and ai2, and likewise the convex combination of ai3 and ai4, are obtained separately. Then Mic and Mis, which are the convex combination and the standard deviation respectively, are acquired from these components. Finally,...


Constraint aggregation principle in convex optimization

A general constraint aggregation technique is proposed for convex optimization problems. At each iteration a set of convex inequalities and linear equations is replaced by a single inequality formed as a linear combination of the original constraints. After solving the simplified subproblem, new aggregation coefficients are calculated and the iteration continues. This general aggregation princi...
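The iteration described above can be sketched on a toy problem. Everything here is a hand-rolled illustration: the problem, the closed-form subproblem solution, and especially the coefficient update, which is a simple grow-the-violated-weights heuristic rather than the paper's actual rule:

```python
# Toy constraint aggregation:
#   minimize x1^2 + x2^2  s.t.  g1 = 1 - x1 - x2 <= 0,  g2 = 0.3 - x1 <= 0.
# Each iteration replaces both constraints by a single linear combination
# a.x >= b and solves that simplified subproblem in closed form.

def aggregate_solve(iters=500):
    lam = [1.0, 1.0]                       # aggregation coefficients
    x = (0.0, 0.0)
    for _ in range(iters):
        # aggregated constraint: lam0*(1 - x1 - x2) + lam1*(0.3 - x1) <= 0,
        # i.e. (lam0 + lam1)*x1 + lam0*x2 >= lam0 + 0.3*lam1
        a = (lam[0] + lam[1], lam[0])
        b = lam[0] + 0.3 * lam[1]
        # min ||x||^2 s.t. a.x >= b has solution x = b*a/||a||^2 when b > 0
        s = b / (a[0] ** 2 + a[1] ** 2)
        x = (s * a[0], s * a[1])
        # heuristic coefficient update: grow the weight of violated constraints
        g = (1 - x[0] - x[1], 0.3 - x[0])
        lam = [l + max(gi, 0.0) for l, gi in zip(lam, g)]
    return x

x = aggregate_solve()    # drifts toward the true optimum (0.5, 0.5)
```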


Sharp oracle bounds for monotone and convex regression through aggregation

We derive oracle inequalities for the problems of isotonic and convex regression using a combination of the Q-aggregation procedure and sparsity pattern aggregation. This improves upon previous results, including the oracle inequalities for the constrained least squares estimator. One of the improvements is that our oracle inequalities are sharp, i.e., with leading constant 1. It allows us to ...


Speech Enhancement by Modified Convex Combination of Fractional Adaptive Filtering

This paper presents new adaptive filtering techniques for use in a speech enhancement system. Adaptive filtering schemes are subject to different trade-offs regarding their steady-state misadjustment, speed of convergence, and tracking performance. Fractional Least-Mean-Square (FLMS) is a new adaptive algorithm which has better performance than the conventional LMS algorithm. Normalization of LMS ...
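The convex-combination scheme itself, independent of the fractional variant, can be sketched with two plain LMS filters, one fast and one slow, mixed through a sigmoid-parameterized weight in the style of the standard convex-combination construction. The step sizes, tap counts, and test signal below are all invented, and plain LMS stands in for the paper's FLMS components:

```python
import math
import random

def lms_combination(x, d, n_taps=4, mu_fast=0.1, mu_slow=0.01, mu_a=1.0):
    """Convex combination of a fast and a slow LMS filter; returns the errors."""
    w_fast = [0.0] * n_taps
    w_slow = [0.0] * n_taps
    a = 0.0                                     # mixing state: lam = sigmoid(a)
    errors = []
    for n in range(n_taps, len(x)):
        u = x[n - n_taps + 1 : n + 1][::-1]     # regressor, most recent first
        y_f = sum(w * ui for w, ui in zip(w_fast, u))
        y_s = sum(w * ui for w, ui in zip(w_slow, u))
        lam = 1.0 / (1.0 + math.exp(-a))
        e = d[n] - (lam * y_f + (1.0 - lam) * y_s)   # error of the combination
        e_f = d[n] - y_f
        e_s = d[n] - y_s
        w_fast = [w + mu_fast * e_f * ui for w, ui in zip(w_fast, u)]
        w_slow = [w + mu_slow * e_s * ui for w, ui in zip(w_slow, u)]
        # stochastic-gradient update of the mixing parameter, clipped to [-4, 4]
        a = max(-4.0, min(4.0, a + mu_a * e * (y_f - y_s) * lam * (1.0 - lam)))
        errors.append(e)
    return errors

# Identify an invented 4-tap FIR channel from its input/output signals.
random.seed(1)
h = [0.5, -0.3, 0.2, 0.1]
x = [random.uniform(-1, 1) for _ in range(1000)]
d = [sum(h[k] * x[n - k] for k in range(len(h)) if n - k >= 0)
     for n in range(len(x))]
errors = lms_combination(x, d)    # error magnitudes shrink as the filters adapt
```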



Journal title:

Volume   Issue

Pages  -

Publication date: 2012